Big Tech Has Banned Trump. Now What?
2021-01-16
As the world accepts a Twitter without @realdonaldtrump, the big question is: "Now what?"
Major technology companies have long been accused of giving President Donald Trump special treatment that other users did not receive.
Now, tech companies have banned Trump from their platforms after a mob led by his supporters attacked the U.S. Capitol on January 6.
Trump was blocked from Twitter, Facebook, Snapchat and other social media platforms.
In many ways, removing the president was the easy part. But what happens now?
Will tech companies hold other world leaders to the same standard of behavior?
Will they go further in deciding what is and is not permitted on their platforms, perhaps angering many of their users?
Will all this cause additional online divisions that push people with extreme ideas onto secret platforms?
Although they've long tried to remain neutral, Facebook, Twitter and other platforms are slowly finding that they can play an active part in shaping the modern world.
Their services are used by many angry groups as well as people pushing misinformation about science, politics and medicine.
The companies are moving from defending "free-speech absolutism" toward "an understanding of speech moderation as a matter of public health," said media professor Ethan Zuckerman of the University of Massachusetts-Amherst.
None of this can be fixed quickly, and banning a president with only a few more days in office is not the answer.
But there are ways to be more effective.
When the 26-minute video "Plandemic" suddenly appeared on the internet, it received millions of views in just a few days.
It was filled with untrue information that pointed to a worldwide COVID-19 conspiracy.
Facebook, Twitter and YouTube removed it, but only after it had already been seen millions of times.
But the companies were ready for part two of the video.
When it appeared, it was removed immediately and received very little attention.
"Sharing disinformation about COVID is a danger because it makes it harder for us to fight the disease," Zuckerman said.
He added that "sharing disinformation about voting is an attack on our democracy."
It has been easier for tech companies to act on matters of public health than on politics.
Corporate reactions to Trump and his supporters have led to angry cries of censorship.
Such actions even drew criticism from European leaders such as German Chancellor Angela Merkel, who has little love for Trump.
Merkel's spokesman, Steffen Seibert, said freedom of opinion is one of our most basic rights.
He told reporters that such a right can only be removed or changed by governments, not by "a decision by the management of social media platforms."
That may be possible in Europe, but the issue is much more complex in the U.S., where the First Amendment of the U.S. Constitution protects freedom of expression from government rules.
However, it does not protect freedom of expression from corporate rules on privately owned communication platforms.
Governments, of course, remain free to regulate tech companies.
Over the past year, Trump, other Republicans and some Democrats have called for the removal of a 1996 law known as Section 230.
The law protects social media platforms from being sued for large amounts of money by anyone who feels wronged by something someone else has posted.
Still, few are happy with the often slow reactions of companies like Twitter and Facebook to events like the U.S. Capitol attack, other violent events or live-streamed shootings.
Sarita Schoenebeck is a University of Michigan professor who studies online harassment.
She said it might be time for the platforms to reexamine how they react to problematic material.
Until recently, tech companies have looked only at problematic material on its own.
They have not thought about "the broader social and cultural" effects, she said.
She added that companies should look at democratic ideals, community governance and platform rules to "shape behavior."
Jared Schroeder is an expert on social media and the First Amendment at Southern Methodist University.
He thinks the Trump bans will push supporters to more secretive platforms where "they can organize and communicate."
"The bans have taken away the best tools for organizing people and for Trump to speak to the largest audiences, but these are by no means the only tools," Schroeder said.
I'm Susan Shand.
The Associated Press reported this story. Susan Shand adapted it for Learning English. Bryan Lynn was the editor.

________________________________________________________________

Words in This Story

platform - n. something that allows someone to tell a large number of people about an idea, product, etc.

absolutism - n. a philosophy of never making exceptions

conspiracy - n. a secret plan made by two or more people to do something that is harmful or illegal

censorship - n. a system of examining books, movies, letters, etc., and removing things that are considered offensive, immoral or harmful to society

management - n. the control or organization of something

regulate - v. to make rules or laws that control something

sue - v. to use a legal process to try to get a court of law to force a person, company or organization that has treated you unfairly or hurt you to give you something or to do something

live-stream - v. to put on the internet pictures of events as they happen

harassment - n. behavior that annoys or troubles someone

We want to hear from you. Write to us in the Comments Section, and visit our Facebook page.